MindLLM 1.3B is a 1.3-billion-parameter Transformer language model jointly developed by the Beijing Engineering Research Center of Massive Language Information Processing and Cloud Computing Applications and the Southeast Institute of Information Technology at Beijing Institute of Technology. It supports dialogue generation in both Chinese and English.
Tags: Large Language Model · Transformers · Multilingual (Chinese, English)
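Below is a minimal usage sketch showing how a model in the standard Hugging Face `transformers` format can be loaded for dialogue generation. It assumes the weights are published on the Hugging Face Hub; the repository id `"BIT-DNLP/MindLLM-1.3B"` is a placeholder, not a confirmed path, and generation settings are illustrative only.

```python
# Minimal sketch: load a causal LM with transformers and generate a reply.
# The repo id below is a hypothetical placeholder; replace it with the actual one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BIT-DNLP/MindLLM-1.3B"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# A simple Chinese dialogue prompt; an English prompt works the same way.
prompt = "你好，请用中文介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a response (parameters chosen for illustration).
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```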